Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full text articles may not yet be available without a charge during the embargo (administrative interval).
Some links on this page may take you to non-federal websites. Their policies may differ from those of this site.
- On social media, teens must manage their interpersonal boundaries not only with other people, but also with the algorithms embedded in these platforms. In this context, we engaged seven teens in an Asynchronous Remote Community (ARC) as part of a multi-year Youth Advisory Board (YAB) to discuss how they navigate and cope with these boundaries and to co-design for improved boundary management. Teens had preconceived notions of different platforms and navigated boundaries based on specific goals; yet, they struggled when platforms lacked the granular controls needed to meet their needs. Teens enjoyed the personalization afforded by algorithms, but they felt violated when algorithms pushed unwanted content. Teens designed features for enhanced control over their discoverability and for real-time risk detection to avoid boundary turbulence. We provide design guidelines for improved social media boundary management for youth and pinpoint educational opportunities to enhance teens’ understanding and use of social media privacy settings and algorithms. Free, publicly-accessible full text available June 23, 2026.
- Free, publicly-accessible full text available April 25, 2026.
- Researchers across various fields have investigated how users experience moderation through different perspectives and methodologies. At present, there is a pressing need to synthesize and extract key insights from prior literature to formulate a systematic understanding of what constitutes a moderation experience and to explore how such understanding could further inform moderation-related research and practices. To address this need, we conducted a systematic literature review (SLR) by analyzing 42 empirical studies related to moderation experiences and published between January 2016 and March 2022. We describe these studies' characteristics and how they characterize users' moderation experiences. We further identify five primary perspectives that prior researchers use to conceptualize moderation experiences. These findings suggest an expansive scope of research interests in understanding moderation experiences and in considering moderated users as an important stakeholder group for reflecting on current moderation design; they also point to the dominance of a punitive, solutionist logic in moderation and carry ample implications for future moderation research, design, and practice.
- Transparency matters a lot to people who experience moderation on online platforms; much CSCW research has viewed offering explanations as one of the primary solutions to enhance moderation transparency. However, relatively little attention has been paid to unpacking what transparency entails in moderation design, especially for content creators. We interviewed 28 YouTubers to understand their moderation experiences and analyze the dimensions of moderation transparency. We identified four primary dimensions: participants desired the moderation system to present moderation decisions saliently, explain the decisions profoundly, afford effective communication with users, and offer repair and learning opportunities. We discuss how these four dimensions are mutually constitutive and conditioned in the context of creator moderation, where the target of governance mechanisms extends beyond the content to creator careers. We then elaborate on how a dynamic transparency perspective could value content creators' digital labor, how transparency design could support creators' learning, and implications for the transparency design of other creator platforms.
- How social media platforms could fairly conduct content moderation is gaining attention from society at large. Researchers from HCI and CSCW have investigated whether certain factors could affect how users perceive moderation decisions as fair or unfair. However, little attention has been paid to unpacking or elaborating on the formation processes of users' perceived (un)fairness from their moderation experiences, especially users who monetize their content. By interviewing 21 for-profit YouTubers (i.e., video content creators), we found three primary ways through which participants assess moderation fairness, including equality across their peers, consistency across moderation decisions and policies, and their voice in algorithmic visibility decision-making processes. Building upon the findings, we discuss how our participants' fairness perceptions demonstrate a multi-dimensional notion of moderation fairness and how YouTube implements an algorithmic assemblage to moderate YouTubers. We derive translatable design considerations for a fairer moderation system on platforms affording creator monetization.
- Esports, like traditional sports, face governance challenges such as foul play and match fixing. The esports industry has seen various attempts at governance structures but has yet to form a consensus. In this study, we explore esports governance in League of Legends (LoL), a major esports title. Through a two-stage, mixed-methods analysis of the rule enforcement that Riot Games, LoL's developer and publisher, has performed against esports participants such as professional players and teams, we qualitatively describe rule-breaking behaviors and penalties in LoL esports, and quantitatively measure how contextual factors such as time, perpetrator identity, and region might influence governance outcomes. These findings about rule enforcement allow us to characterize the esports governance of LoL as top-down and paternalistic, and to reflect upon professional players' work and professionalization in the esports context. We conclude by discussing translatable implications for esports governance practice and research.